
    Evaluating the impact of sample storage, handling, and technical ability on the decay and recovery of SARS-CoV-2 in wastewater

    Wastewater-based epidemiology (WBE) is useful for tracking and monitoring the level of disease prevalence in a community and has been used extensively to complement clinical testing during the COVID-19 pandemic. Despite the numerous benefits, sources of variability in sample storage, handling, and processing methods can make WBE data difficult to generalize. We performed an experiment to determine sources of variability in WBE data, including the impact of storage time, handling, and processing techniques on the concentration of SARS-CoV-2 in wastewater influent from three wastewater treatment plants (WWTPs) in North Carolina over 19 days. The SARS-CoV-2 concentration in influent samples held at 4°C did not degrade significantly over the 19-day experiment. Heat pasteurization did not significantly impact the concentration of SARS-CoV-2 at two of the three WWTPs but did reduce viral recovery at the WWTP serving the smallest population. On each processing date, one filter from each sample was processed immediately while a replicate filter was frozen at -80°C. Once processed, the previously frozen filters were found to contain slightly higher concentrations (<0.2 log copies/L) than their immediately processed counterparts, indicating that freezing filters is a viable method for delayed quantification and may even improve recovery at WWTPs with low viral concentrations. Investigation of factors contributing to variability during sample processing indicated that analyst experience level contributed significantly (p<0.001) to accepted droplet generation, while extraction efficiency and reverse transcription efficiency contributed significantly (p<0.05) to day-to-day SARS-CoV-2 variability. This study provides valuable practical information for minimizing decay and/or loss of SARS-CoV-2 in wastewater influent while adhering to safety procedures, promoting efficient laboratory workflows, and accounting for sources of variability.
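    The storage-stability and filter-freezing comparisons described above reduce to simple paired analyses on log10-transformed concentrations. The sketch below is a minimal illustration of that kind of analysis, not the authors' pipeline; the file name, column names, and the specific choice of a linear regression over storage day and a paired t-test for frozen versus immediately processed filters are all assumptions.

```python
# Minimal sketch (not the study's actual pipeline): testing decay over storage
# time and comparing frozen vs. immediately processed filters on a log10 scale.
# The file name, column names, and tests used are assumptions for illustration.
import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("sars2_wastewater.csv")  # hypothetical file: one row per filter
df["log10_copies_per_L"] = np.log10(df["copies_per_L"])

# 1) Decay during 4 degC storage: slope of log10 concentration vs. storage day.
#    A slope indistinguishable from zero is consistent with "no significant decay".
decay = stats.linregress(df["storage_day"], df["log10_copies_per_L"])
print(f"decay slope = {decay.slope:.3f} log10/day, p = {decay.pvalue:.3f}")

# 2) Frozen (-80 degC) vs. immediately processed replicate filters, paired by sample.
wide = df.pivot_table(index="sample_id", columns="treatment",
                      values="log10_copies_per_L")
t, p = stats.ttest_rel(wide["frozen"], wide["immediate"])
diff = (wide["frozen"] - wide["immediate"]).mean()
print(f"mean frozen - immediate = {diff:.2f} log10, p = {p:.3f}")
```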

    Fiber Supplements Derived From Sugarcane Stem, Wheat Dextrin and Psyllium Husk Have Different In Vitro Effects on the Human Gut Microbiota

    There is growing public interest in the use of fiber supplements as a way of increasing dietary fiber intake and potentially improving the gut microbiota composition and digestive health. However, there is currently limited research into the effects of commercially available fiber supplements on the gut microbiota. Here we used an in vitro human digestive and gut microbiota model system to investigate the effect of three commercial fiber products: NutriKane™, Benefiber® and Psyllium husk (Macro), on the adult gut microbiota. The 16S rRNA gene amplicon sequencing results showed dramatic fiber-dependent changes in gut microbiota structure and composition. Specific bacterial OTUs within the families Bacteroidaceae, Porphyromonadaceae, Ruminococcaceae, Lachnospiraceae, and Bifidobacteriaceae increased in relative abundance in the presence of one or more of the fiber products, while Enterobacteriaceae and Pseudomonadaceae decreased in relative abundance with all fiber treatments compared to the no-added-fiber control. Fiber-specific increases in short-chain fatty acid (SCFA) concentrations correlated with the relative abundance of potential SCFA-producing gut bacteria. The chemical composition, antioxidant potential and polyphenolic content profiles of each fiber product were determined and found to be highly variable. Observed product-specific variations could be linked to differences in the chemical composition of the fiber products. The general nature of the fiber-dependent impact was relatively consistent across individuals, which may demonstrate the potential of the products to alter the gut microbiota in a similar, predictable direction despite variability in the starting composition of the individual gut microbiota.
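    The reported link between SCFA concentrations and the abundance of potential SCFA producers can be illustrated with a short relative-abundance calculation. The sketch below is not the study's workflow; the file names, column names, and the families grouped here as putative SCFA producers are assumptions for illustration only.

```python
# Minimal sketch (not the study's workflow): relative abundances from a 16S OTU
# count table and a Spearman correlation between SCFA concentration and the
# summed abundance of putative SCFA producers. Files, columns, and the family
# grouping are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

counts = pd.read_csv("otu_counts.csv", index_col="otu_id")      # OTUs x samples
taxonomy = pd.read_csv("otu_taxonomy.csv", index_col="otu_id")  # otu_id -> family
scfa = pd.read_csv("scfa_mM.csv", index_col="sample_id")["total_scfa_mM"]

# Convert raw counts to per-sample relative abundances.
rel_abund = counts.div(counts.sum(axis=0), axis=1)

# Sum relative abundance of families commonly associated with SCFA production.
producer_families = ["Ruminococcaceae", "Lachnospiraceae", "Bifidobacteriaceae"]
producer_otus = taxonomy.index[taxonomy["family"].isin(producer_families)]
producer_share = rel_abund.loc[producer_otus].sum(axis=0)

rho, p = spearmanr(producer_share[scfa.index], scfa)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```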

    Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study

    Purpose: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom. Methods: Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded. Results: The overall prevalence of delirium was 16.3% (n = 483). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) in CFS 4 vs 1–3; OR 12.4 (6.2–24.5) in CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium (OR 0.7 (0.3–1.9) in CFS 4 compared to 0.2 (0.1–0.7) in CFS 8). Both associations were independent of age and dementia. Conclusion: We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are the least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
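    The frailty-banded odds ratios adjusted for age and dementia suggest an analysis along the lines of the logistic regression sketched below. This is an assumed analysis with a hypothetical data frame and column names, not the audit's actual code.

```python
# Minimal sketch (assumed analysis, not the audit's code): odds of delirium by
# banded Clinical Frailty Scale score, adjusted for age and dementia, via
# logistic regression. The data frame and its columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("delirium_audit.csv")  # one row per patient

# Band CFS as in the abstract: 1-3 (reference), then 4 through 8.
df["cfs_band"] = pd.cut(df["cfs"], bins=[0, 3, 4, 5, 6, 7, 8],
                        labels=["1-3", "4", "5", "6", "7", "8"])

model = smf.logit("delirium ~ C(cfs_band, Treatment('1-3')) + age + dementia",
                  data=df).fit()

# Exponentiate coefficients to report odds ratios with 95% CIs.
ors = pd.concat([np.exp(model.params), np.exp(model.conf_int())], axis=1)
ors.columns = ["OR", "2.5%", "97.5%"]
print(ors)
```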

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
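    The reported posterior probabilities of harm can be approximately reconstructed from the published odds-ratio summaries. The sketch below assumes the posterior for the log odds ratio is roughly normal with a symmetric 95% credible interval on the log scale; it is a back-of-the-envelope check, not the trial's bayesian cumulative logistic model.

```python
# Back-of-the-envelope sketch: approximate the posterior for the log OR as normal,
# using the reported median and 95% credible interval, and compute the posterior
# probability that treatment worsened organ support-free days (OR < 1).
# This is an approximation, not the trial's bayesian cumulative logistic model,
# so it will not reproduce the published percentages exactly.
import numpy as np
from scipy.stats import norm

def prob_harm(or_median, ci_low, ci_high):
    mu = np.log(or_median)
    sigma = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    return norm.cdf(0.0, loc=mu, scale=sigma)  # P(log OR < 0) = P(OR < 1)

print(f"ACE inhibitor: {prob_harm(0.77, 0.58, 1.06):.1%}")  # trial reported 94.9%
print(f"ARB:           {prob_harm(0.76, 0.56, 1.05):.1%}")  # trial reported 95.4%
```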

    Day-to-day activities of clinical dietitians working in the inpatient and outpatient settings in a group of New South Wales public hospitals: The results of a direct observational study

    Aim: To describe the daily tasks undertaken by dietitians working within the inpatient and outpatient sectors of the NSW public hospital system. Methods: This study used an ethnographic methodology that employed a direct, non-participatory, discontinuous, observational technique to observe hospital dietitians, in both outpatient and inpatient settings, during a typical work shift. Trained volunteer observers collected the data over a three-year period (2008, 2009 and 2010). The data were combined and then sorted into five categories: direct patient care, indirect patient care, communication, administration, and education of self or others. Results: A total of 609 hours and 21 minutes of observation were recorded across the three-year period (2008–2010). On average, dietitians in the inpatient and outpatient settings spent 18.3–32.2% of their time in direct patient care activities. The majority of time was spent in indirect patient care activities such as information collection, documentation and discussion with other health-care professionals. A comparison between the two work settings showed that dietitians working in the inpatient setting spent less time in direct patient care (18.3% vs 33.1%, P < 0.05), and more time in indirect care activities (41.7% vs 23.3%, P < 0.05) and in communication about patient care (22.7% vs 14.4%, P < 0.05). Conclusions: The findings show that dietitians spend most of their time doing activities that support patient care, but these activities occur away from the patient.
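    The time-share figures above come from categorising observed task episodes and summing durations within each setting. The sketch below shows one way such an aggregation could be done; the file name, column names, and category labels are hypothetical and are not taken from the study.

```python
# Minimal sketch (hypothetical columns, not the study's analysis): aggregating
# direct-observation logs into per-setting time shares for the five task categories.
import pandas as pd

obs = pd.read_csv("dietitian_observations.csv")  # one row per observed task episode
# expected columns: setting ("inpatient"/"outpatient"), category, duration_min

minutes = obs.groupby(["setting", "category"])["duration_min"].sum()
share = minutes / minutes.groupby(level="setting").transform("sum") * 100

print(share.unstack("category").round(1))  # % of observed time per category
```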

    A practical clinical kinematic model for the upper limbs

    A novel, clinically practical upper limb model is introduced that has been developed through clinical use in children and adults with neurological conditions to guide surgery to the elbow and wrist. The model has a minimal marker set, minimal virtual markers, and no functional joint centres, to minimise the demands on the patient and the duration of data collection. The model calculates forearm supination independently from the humerus segment, eliminating any errors introduced by poor modelling of the shoulder joint centre. Supination is calculated by defining the forearm segment twice, from the distal and proximal ends: first using the ulna and radial wrist markers as a segment-defining line, and second using the medial and lateral elbow markers as a segment-defining line. This is comparable to the clinical measurement of supination using a goniometer and enables a reduced marker set, with only the elbow, wrist, and hand markers applied when only the wrist and forearm angles are of interest. A sensitivity analysis of the calculated elbow flexion–extension angles to the position of the glenohumeral joint centre is performed on one healthy female subject, aged 20 years, during elbow flexion and a forward reaching task. The supination angles calculated using the novel technique are also compared with the rotation between the humeral and forearm segments. All angles are compared to a published kinematic model that follows the recommendations of the International Society of Biomechanics.
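    The dual segment-defining-line approach to supination lends itself to a simple geometric calculation: project the wrist and elbow marker lines onto the plane perpendicular to the forearm long axis and take the signed angle between them. The sketch below is a geometric illustration of that idea under stated assumptions (marker naming, long-axis definition, sign convention), not the published model's implementation.

```python
# Geometric sketch of the idea described above (not the published model): forearm
# supination estimated as the signed angle between the wrist marker line and the
# elbow marker line, measured about the forearm long axis. The long-axis choice
# (elbow midpoint to wrist midpoint) and the sign convention are assumptions.
import numpy as np

def supination_angle_deg(med_elbow, lat_elbow, ulna_wrist, radial_wrist):
    """All inputs are 3D marker positions (arrays of shape (3,)) in the lab frame."""
    long_axis = (ulna_wrist + radial_wrist) / 2 - (med_elbow + lat_elbow) / 2
    long_axis /= np.linalg.norm(long_axis)

    def project(v):
        # Remove the component along the forearm long axis, then normalise.
        v = v - np.dot(v, long_axis) * long_axis
        return v / np.linalg.norm(v)

    proximal = project(lat_elbow - med_elbow)    # elbow segment-defining line
    distal = project(radial_wrist - ulna_wrist)  # wrist segment-defining line

    # Signed angle from the proximal to the distal line about the long axis.
    return np.degrees(np.arctan2(np.dot(np.cross(proximal, distal), long_axis),
                                 np.dot(proximal, distal)))

# Example with made-up marker positions (metres):
print(supination_angle_deg(np.array([0.00, 0.05, 0.30]),
                           np.array([0.06, 0.05, 0.30]),
                           np.array([0.01, 0.05, 0.05]),
                           np.array([0.05, 0.02, 0.05])))
```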